National Repository of Grey Literature: 11 records found (showing 1-10)
A Comparison Particle Filter for Searching a Radiation Source in Real and Simulated World
Cihlar, Milos ; Lazna, Tomas ; Zalud, Ludek
In this paper, we focus on comparing solutions for localizing an unknown radiation source in both a Gazebo simulator and the real world. A proper simulation of the environment, sensors, and radiation source can significantly reduce the development time of robotic algorithms. We propose a simple sampling importance resampling (SIR) particle filter. To verify its effectiveness and similarities, we first tested the algorithm's performance in the real world and then in the Gazebo simulator. In the experiment, we used a 2-inch NaI(Tl) radiation detector and a Cesium-137 radiation source with an activity of 330 MBq. We compared the algorithm's progress using the evolution of information entropy, variance, and Kullback-Leibler divergence. The proposed metrics demonstrated the similarity between the simulator and the real world, providing valuable insights to improve and facilitate further development of radiation search and mapping algorithms.
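To make the method concrete, a minimal sketch of one SIR update for source localization follows. The Poisson count model with inverse-square attenuation, the jitter scale, and all names are illustrative assumptions; the paper's detector model and resampling details may differ.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

def expected_counts(src_xy, det_xy, activity, dt=1.0, background=5.0):
    """Inverse-square count model (illustrative; a real detector needs calibration)."""
    d2 = np.sum((src_xy - det_xy) ** 2, axis=-1) + 1e-6
    return background + activity * dt / d2

def sir_step(particles, weights, det_xy, observed_counts, activity):
    """One sampling-importance-resampling update over source-position hypotheses."""
    lam = expected_counts(particles, det_xy, activity)
    weights = weights * poisson.pmf(observed_counts, lam)
    weights /= weights.sum()
    # Systematic resampling when the effective sample size collapses.
    n = len(weights)
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles = particles[idx] + rng.normal(0.0, 0.1, particles.shape)  # jitter
        weights = np.full(n, 1.0 / n)
    return particles, weights

# Toy run: 1000 position hypotheses on a 20 m square, detector at the origin.
particles = rng.uniform(0.0, 20.0, (1000, 2))
weights = np.full(1000, 1e-3)
particles, weights = sir_step(particles, weights, np.zeros(2),
                              observed_counts=42, activity=300.0)
# Entropy of the weights: one convergence metric of the kind the paper tracks.
entropy = -np.sum(weights * np.log(weights + 1e-12))
```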
Cross-entropy based combination of discrete probability distributions for distributed decision making
Sečkárová, Vladimíra ; Kárný, Miroslav (advisor) ; Jurečková, Jana (referee) ; Janžura, Martin (referee)
Dissertation abstract. Title: Cross-entropy based combination of discrete probability distributions for distributed decision making. Author: Vladimíra Sečkárová. Author's email: seckarov@karlin.mff.cuni.cz. Department: Department of Probability and Mathematical Statistics, Faculty of Mathematics and Physics, Charles University in Prague. Supervisor: Ing. Miroslav Kárný, DrSc., The Institute of Information Theory and Automation of the Czech Academy of Sciences. Supervisor's email: school@utia.cas.cz. Abstract: In this work we propose a systematic way to combine discrete probability distributions based on decision-making theory and information theory, namely the cross-entropy (also known as the Kullback-Leibler (KL) divergence). The optimal combination is a probability mass function minimizing the conditional expected KL divergence. The expectation is taken with respect to a probability density function that also minimizes the KL divergence under problem-reflecting constraints. Although the combination is derived for the case when sources provide probabilistic information on a common support, it can be applied to other types of given information by the proposed transformation and/or extension. The discussion covers the proposed combining and sequential processing of available data, duplicate data, influence...
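As a hedged illustration of the kind of combination involved: for fixed source weights, the pmf q minimizing the weighted expected divergence sum_i w_i D(p_i || q) is the weighted linear pool, a standard result. The sketch below shows only this final minimization step; the weights and supports are made up, and the thesis itself derives the expectation density rather than fixing weights.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for pmfs on a common support."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def combine(pmfs, weights):
    """Pmf minimizing sum_i w_i D(p_i || q): the weighted linear pool."""
    pmfs, w = np.asarray(pmfs, float), np.asarray(weights, float)
    return (w[:, None] * pmfs).sum(axis=0) / w.sum()

sources = [[0.7, 0.2, 0.1], [0.5, 0.3, 0.2], [0.6, 0.1, 0.3]]
q = combine(sources, weights=[1.0, 1.0, 2.0])
print(q, [kl(p, q) for p in sources])
```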
Balancing Exploitation and Exploration via Fully Probabilistic Design of Decision Policies
Kárný, Miroslav ; Hůla, František
Adaptive decision making learns an environment model serving the design of a decision policy. The policy-generated actions influence both the acquired reward and the future knowledge. The optimal policy properly balances exploitation with exploration. The inherent curse of dimensionality of decision making under incomplete knowledge prevents the realisation of the optimal design.
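Schematically, fully probabilistic design (FPD) casts policy design as a divergence minimization; a standard statement of the idea follows (the notation here is illustrative, not quoted from the paper):

$$ \pi^{\mathrm{opt}} \in \arg\min_{\pi} D\big(f_{\pi} \,\big\|\, f^{\mathrm{I}}\big), \qquad D(f\|g) = \int f(b)\,\ln\frac{f(b)}{g(b)}\,\mathrm{d}b, $$

where $f_{\pi}$ is the joint pdf of the closed-loop behaviour $b$ under policy $\pi$ and $f^{\mathrm{I}}$ is an ideal pdf encoding the decision-maker's aims. The FPD-optimal policy is randomised, which is what lets it retain exploration while exploiting the learnt model.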
Evaluation of Kullback-Leibler Divergence
Homolová, Jitka ; Kárný, Miroslav
Kullback-Leibler divergence is a leading measure of the similarity or dissimilarity of probability distributions. This technical paper collects its analytical and numerical expressions for a broad range of distributions.
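A typical entry in such a collection is the closed form for two univariate normal distributions, given here as a representative example (a standard result, not quoted from the paper):

$$ D\big(\mathcal{N}(\mu_1,\sigma_1^2)\,\big\|\,\mathcal{N}(\mu_2,\sigma_2^2)\big) \;=\; \ln\frac{\sigma_2}{\sigma_1} \;+\; \frac{\sigma_1^2 + (\mu_1-\mu_2)^2}{2\sigma_2^2} \;-\; \frac{1}{2}. $$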
Recursive Estimation of High-Order Markov Chains: Approximation by Finite Mixtures
Kárný, Miroslav
A high-order Markov chain is a universal model of stochastic relations between discrete-valued variables. The exact estimation of its transition probabilities suffers from the curse of dimensionality. It requires an excessive amount of informative observations as well as an extreme memory for storing the corresponding sufficient statistic. The paper bypasses this problem by considering a rich subset of Markov-chain models, namely, mixtures of low-dimensional Markov chains, possibly with external variables. It uses Bayesian approximate estimation suitable for a subsequent decision making under uncertainty. The proposed recursive (sequential, one-pass) estimator updates a product of Dirichlet probability densities (pds) used as an approximate posterior pd, projects the result back to this class of pds and applies an improved data-dependent stabilised forgetting, which counteracts the dangerous accumulation of approximation errors.
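A minimal sketch of the recursive update for a single low-dimensional component follows. The Dirichlet-count bookkeeping and the stabilised-forgetting step are the point; the class name, the scalar forgetting factor, and the flat alternative statistic are illustrative assumptions (the paper estimates a mixture of such components and projects back after each step).

```python
import numpy as np

class DirichletMarkovEstimator:
    """Recursive Bayesian estimate of a Markov-chain transition matrix with
    stabilised exponential forgetting (a simplified, single-component sketch)."""

    def __init__(self, n_states, prior=1.0, forgetting=0.99):
        self.V = np.full((n_states, n_states), prior)   # Dirichlet counts
        self.V0 = np.full((n_states, n_states), prior)  # stabilising alternative
        self.lam = forgetting

    def update(self, prev_state, state):
        # Flatten towards the alternative before adding the new observation;
        # this bounds the statistic and counteracts error accumulation.
        self.V = self.lam * self.V + (1.0 - self.lam) * self.V0
        self.V[prev_state, state] += 1.0

    def point_estimate(self):
        """Posterior-mean transition matrix implied by the Dirichlet counts."""
        return self.V / self.V.sum(axis=1, keepdims=True)

est = DirichletMarkovEstimator(n_states=3)
for s, s_next in [(0, 1), (1, 2), (2, 0), (0, 1)]:
    est.update(s, s_next)
print(est.point_estimate())
```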
Approximate Bayesian Recursive Estimation: On Approximation Errors
Kárný, Miroslav ; Dedecius, Kamil
Adaptive systems rely on recursive estimation of a firmly bounded complexity. As a rule, they have to use an approximation of the posterior probability density function (pdf), which comprises unreduced information about the estimated parameter. In the recursive setting, the latest approximate pdf is updated using the learnt system model and the newest data and then approximated. The fact that approximation errors may accumulate over time is mostly neglected in the estimator design and, at most, checked ex post. The paper inspects this problem.
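Schematically, the inspected recursion interleaves the exact Bayes update with a projection $\mathcal{A}$ onto the feasible family of pdfs (the notation is illustrative):

$$ \hat f_t(\theta) \;=\; \mathcal{A}\!\left(\frac{m(y_t\,|\,\theta)\,\hat f_{t-1}(\theta)}{\int m(y_t\,|\,\theta)\,\hat f_{t-1}(\theta)\,\mathrm{d}\theta}\right), $$

where $m(y_t\,|\,\theta)$ is the learnt system model. Each application of $\mathcal{A}$ introduces an error, and since $\hat f_{t-1}$ already carries all previous errors, they can compound over $t$; this compounding is exactly what checking the approximation only ex post fails to control.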
Optimal conditions for maximization of the information divergence from an exponential family
Matúš, František
The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q in E. All directional derivatives of the divergence from E are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for P to be a maximizer of the divergence from E are presented, including new ones for the case when P is not projectable to E.
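In symbols, with $\mathcal{E}=\{Q_\vartheta : Q_\vartheta(x) \propto \exp\langle\vartheta,\varphi(x)\rangle\}$ on a finite set $X$ (a standard parametrization, assumed here for concreteness):

$$ D(P\,\|\,\mathcal{E}) \;=\; \inf_{Q\in\mathcal{E}} D(P\,\|\,Q), \qquad D(P\,\|\,Q)=\sum_{x\in X} P(x)\,\ln\frac{P(x)}{Q(x)}. $$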
On maximization of the information divergence from an exponential family
Matúš, František ; Ay, N.
The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q in E. For convex exponential families, the local maximizers of this function of P are found. A general exponential family E of dimension d is enlarged to an exponential family E* of dimension at most 3d+2 such that the local maximizers are of zero divergence from E*.
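For intuition, the divergence from a family and its maximization can be explored numerically on a toy one-dimensional family; the support, the statistic $\varphi(x)=x$, and the brute-force grids below are illustrative choices, not the paper's method.

```python
import numpy as np

X = np.arange(3)                      # finite support {0, 1, 2}
thetas = np.linspace(-8.0, 8.0, 401)  # grid over the natural parameter

def family(theta):
    """Member Q_theta(x) proportional to exp(theta * x) of a 1-parameter family."""
    w = np.exp(theta * X)
    return w / w.sum()

def divergence_from_family(P, eps=1e-12):
    """D(P || E) = inf over Q in E of D(P || Q), approximated on the theta grid."""
    return min(
        float(np.sum(P * (np.log(P + eps) - np.log(family(th)))))
        for th in thetas
    )

# Coarse search for the maximizer of P -> D(P || E) over the probability simplex.
best_P, best_val = None, -np.inf
grid = np.linspace(0.0, 1.0, 41)
for a in grid:
    for b in grid:
        if a + b <= 1.0:
            P = np.array([a, b, 1.0 - a - b])
            val = divergence_from_family(P)
            if val > best_val:
                best_P, best_val = P, val
print(best_P, best_val)  # expect mass pushed away from the family's closure
```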
